Following up on my article about how privacy works on iOS, I thought it would be helpful to also write a more opinionated post on the subject. I’ll focus on the security of the address book, since that is the topic du jour, but these concepts apply more generally as well.
There are three technical approaches Apple could take in response to this issue:
- Remove the Feature (No Access): If the data in your address book is considered too sensitive to expose to third-party developers, Apple could remove that access entirely.
- Add a User Alert (Constrained Access): Show an alert asking for the user’s permission whenever an application wants to access the address book (a sketch of this pattern follows the list).
- Do Nothing (Open Access): Leave the platform as it is now, giving any application approved into the App Store full read-write access to your address book.
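To make the second option concrete, here is a minimal sketch of what a permission prompt looks like using the Contacts framework Apple later shipped. It illustrates the pattern rather than any API that existed when this was written, and the fields fetched are just an example.

```swift
import Contacts

// "Constrained access": the system shows the permission alert on the
// app's behalf the first time this runs, and the app only reaches the
// address book if the user agrees.
let store = CNContactStore()
store.requestAccess(for: .contacts) { granted, error in
    guard granted else {
        print("User declined address book access: \(String(describing: error))")
        return
    }
    // Access granted -- read a couple of fields as an example.
    let keys = [CNContactGivenNameKey as CNKeyDescriptor,
                CNContactFamilyNameKey as CNKeyDescriptor]
    let request = CNContactFetchRequest(keysToFetch: keys)
    try? store.enumerateContacts(with: request) { contact, _ in
        print(contact.givenName, contact.familyName)
    }
}
```

Note that the prompt only governs the initial grant; nothing about it constrains what an app does with the data afterwards, which is the point argued below.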
If the goal of any policy change is to guarantee the security of user data, only the first option would be sufficient. Once a developer has access to the data, there is no realistic way to ensure it isn’t used for nefarious purposes or in ways that violate user trust. However, such a change would dramatically hurt the platform in both user experience and capability.
Adding a user alert before providing access to the data seems, superficially, to enhance the security of a user’s data by at least asking their permission. In reality, however, this approach does little to change the current state of affairs.
Look no further than the spread of malware on Android, where applications have been caught doing all manner of nefarious things with user data. In each case, the user had granted the application permission to access their data through the Android Market permissions system.
The problem isn’t that apps are given access to user data; the problem is how they then use it. There are dozens of perfectly valid reasons for an application to ask for access to your address book, and nothing about the permission prompt stops it from then using that data in ways that violate user trust. Whether or not the user initially gave their permission, if the data is used against their wishes they will feel betrayed.
I’d argue that Apple already has the best tool it could possibly have for ensuring user privacy — a closed App Store.
Every application developer in the App Store has signed a legal agreement with Apple stating that they will abide by the rules of the Store. If a developer violates those rules, their continued presence in the Store is in jeopardy. Apple has laid out clear privacy rules that all applications are required to follow. The best defense against violations of user trust would be for Apple to be even more vigorous in its defense of user data and in enforcing its existing contractual rights against violators.
While app review cannot be expected to catch every nefarious use of private data (only deep forensic analysis could do that), it can certainly catch blatant and clear violations. I hope that the sandboxing mechanism introduced in OS X Lion, which requires developers to justify the system features they use or be denied access to them, will be brought to iOS. That would do much to enhance the effectiveness of app review.
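For reference, OS X sandboxing expresses those justifications as entitlements (for example, `com.apple.security.personal-information.addressbook` guards Address Book access), and iOS eventually adopted a similar declare-or-be-denied model through Info.plist usage descriptions plus a runtime authorization check. The sketch below assumes that modern mechanism, which postdates this article.

```swift
import Contacts

// Under the declare-or-be-denied model, an app must state why it needs
// the address book (NSContactsUsageDescription in Info.plist); without
// that declaration the system will not grant access at all. Even with
// it, the app can inspect its current authorization before touching data.
func describeContactsAuthorization() -> String {
    switch CNContactStore.authorizationStatus(for: .contacts) {
    case .notDetermined: return "user has not been asked yet"
    case .restricted:    return "access restricted by device policy"
    case .denied:        return "user denied access"
    case .authorized:    return "access granted"
    default:             return "other or newer status"
    }
}
```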
Apple should also be expected to remove apps that violate policy from the store in a timely manner. I was surprised that Path wasn’t pulled when it became clear they were in violation of sections 17.1 and 17.2 of the App Review Guidelines. Such a move (even if they were allowed back after fixing the issue) would have done much to discourage this behavior in other developers by setting an example that this type of data use is strictly forbidden.
Rather than adding security theatre to iOS by nagging the user about each access to their data, a more precise wielding of the tools already in place would be a far more effective approach, and one that actually benefits users.